Isobaric yield ratio difference and Shannon information entropy
Authors
Abstract
Similar resources
Shannon Entropy, Renyi Entropy, and Information
This memo contains proofs that the Shannon entropy is the limiting case of both the Renyi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy is provided in an appendix.
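For reference, the limiting relation referred to here can be stated compactly; the display below is the standard textbook formulation rather than a quotation from the memo:

H_\alpha(p) = \frac{1}{1-\alpha}\,\ln\sum_i p_i^{\alpha}, \qquad S_q(p) = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{q}\Bigr), \qquad \lim_{\alpha\to 1} H_\alpha(p) = \lim_{q\to 1} S_q(p) = -\sum_i p_i \ln p_i,

i.e. both the Renyi entropy H_\alpha and the Tsallis entropy S_q reduce to the Shannon entropy as their order parameters tend to 1 (a short application of L'Hôpital's rule).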
Shannon Information Entropy in Heavy-ion Collisions
The general idea of information entropy provided by C.E. Shannon “hangs over everything we do” and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information of a quantity with its specific distribution, for which information-entropy-based methods have ...
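For the quantity being discussed, the standard definition (not specific to the heavy-ion setting) is

H = -\sum_i p_i \ln p_i,

where p_i is the probability of outcome i under the distribution of interest, e.g. a measured fragment yield distribution normalized to unit total probability.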
Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions
The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study...
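As a point of reference for these extensions, the multivariate normal case they generalize has the well-known closed form (standard results, not quoted from the paper)

H(X) = \tfrac{1}{2}\ln\bigl[(2\pi e)^d\,\lvert\Sigma\rvert\bigr], \qquad I(X;Y) = H(X) + H(Y) - H(X,Y),

for a d-dimensional normal vector X with covariance matrix \Sigma.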
How well do practical information measures estimate the Shannon entropy?
Estimating the entropy of finite strings has applications in areas such as event detection, similarity measurement, or the performance assessment of compression algorithms. This report compares a variety of computable information measures for finite strings that may be used in entropy estimation. These include Shannon’s n-block entropy, the three variants of the Lempel-Ziv production complexi...
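As one concrete instance of the measures listed, a minimal sketch of Shannon's n-block entropy of a finite string is given below; the function name block_entropy and the example strings are illustrative choices, not taken from the report.

from collections import Counter
from math import log2

def block_entropy(s, n):
    # Entropy (in bits) of the empirical distribution of the
    # overlapping length-n substrings of the string s.
    blocks = [s[i:i + n] for i in range(len(s) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A strictly alternating string has low 2-block entropy (about 1 bit),
# while a more irregular string of the same length scores higher.
print(block_entropy("0101010101", 2))
print(block_entropy("0110100110", 2))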
On the Estimation of Shannon Entropy
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and then comparisons are made with Vasicek's (1976), van Es (1992), Ebrahimi et al. (1994) and Correa (1995) entropy estimators. A simulation st...
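For context, Vasicek's (1976) spacing estimator, which the comparisons refer to, can be sketched as follows; the window choice m ≈ √n is a simple heuristic and not necessarily the tuning used in the article.

import numpy as np

def vasicek_entropy(x, m=None):
    # Vasicek (1976) spacing estimator of differential entropy (in nats):
    # mean of log( n/(2m) * (X_(i+m) - X_(i-m)) ) over the ordered sample,
    # with order statistics clamped at the sample boundaries.
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n))))
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n / (2 * m) * (upper - lower)))

# Sanity check: a standard normal has entropy 0.5*ln(2*pi*e) ≈ 1.419 nats.
rng = np.random.default_rng(0)
print(vasicek_entropy(rng.standard_normal(5000)))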
Journal
Journal title: Physics Letters B
Year: 2015
ISSN: 0370-2693
DOI: 10.1016/j.physletb.2015.01.015